Convex Total Least Squares
Authors
Abstract
We study the total least squares (TLS) problem, which generalizes least squares regression by allowing measurement errors in both dependent and independent variables. TLS is widely used in applied fields including computer vision, system identification, and econometrics. The special case in which all dependent and independent variables have the same level of uncorrelated Gaussian noise, known as ordinary TLS, can be solved by singular value decomposition (SVD). However, SVD cannot solve many important practical TLS problems with realistic noise structure, such as varying measurement noise, known structure on the errors, or large outliers requiring robust error norms. To solve such problems, we develop convex relaxation approaches for a general class of structured TLS (STLS). We show, both theoretically and experimentally, that while the plain nuclear norm relaxation incurs large approximation errors for STLS, the reweighted nuclear norm approach is very effective and achieves better accuracy on challenging STLS problems than popular non-convex solvers. We describe a fast solution based on an augmented Lagrangian formulation, and we apply our approach to an important class of biological problems that use population-average measurements to infer cell-type- and physiological-state-specific expression levels that are very hard to measure directly.
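For concreteness, the SVD solution to ordinary TLS mentioned in the abstract can be sketched in a few lines of NumPy. This is only the classical baseline, not the paper's STLS relaxation; the function name `ordinary_tls` and the toy data are illustrative.

```python
import numpy as np

def ordinary_tls(A, b):
    """Ordinary TLS via SVD: stack C = [A | b]; the solution lies in the
    right singular vector of C for the smallest singular value."""
    n = A.shape[1]
    C = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                      # singular vector for the smallest sigma
    if np.isclose(v[n], 0.0):
        raise ValueError("nongeneric TLS problem: last component is ~0")
    return -v[:n] / v[n]

# Toy usage: the same Gaussian noise level on both A and b.
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0])
A = rng.normal(size=(100, 2))
b = A @ x_true
x_hat = ordinary_tls(A + 0.05 * rng.normal(size=A.shape),
                     b + 0.05 * rng.normal(size=b.shape))
```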
Similar Resources
A Projected Alternating Least square Approach for Computation of Nonnegative Matrix Factorization
Nonnegative matrix factorization (NMF) is a common method in data mining that has been used in different applications as a dimension reduction, classification, or clustering method. Methods in the alternating least squares (ALS) approach are usually used to solve this non-convex minimization problem. At each step of ALS algorithms, two convex least squares problems should be solved, which causes high com...
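As a rough sketch of the ALS scheme this snippet describes, assuming the projection is the usual clipping to the nonnegative orthant after each convex least squares solve (the names `nmf_als` and `iters` are illustrative, not this paper's algorithm):

```python
import numpy as np

def nmf_als(X, r, iters=200, seed=0):
    """Projected ALS for NMF: alternate the two convex least squares
    subproblems for H and W, clipping each to be nonnegative."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    for _ in range(iters):
        H = np.linalg.lstsq(W, X, rcond=None)[0]        # min ||W H - X|| over H
        H = np.maximum(H, 0.0)                          # project H >= 0
        W = np.linalg.lstsq(H.T, X.T, rcond=None)[0].T  # min ||W H - X|| over W
        W = np.maximum(W, 0.0)                          # project W >= 0
    return W, H
```

Each inner solve is convex, but the overall factorization problem remains non-convex, so different initializations can produce different factors.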
Regularization in Regression with Bounded Noise: A Chebyshev Center Approach
We consider the problem of estimating a vector z in the regression model b = Az + w, where w is unknown but bounded noise. As in many regularization schemes, we assume that an upper bound on the norm of z is available. To estimate z we propose a relaxation of the Chebyshev center, which is the vector that minimizes the worst-case estimation error over all feasible vectors z. Relying on recent ...
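This is not the Chebyshev-center relaxation itself, but the norm-constrained least squares problem min ||Az − b|| subject to ||z|| ≤ η that underlies such schemes can be solved by bisecting on a ridge parameter; a minimal sketch with illustrative names:

```python
import numpy as np

def norm_constrained_ls(A, b, eta, tol=1e-8):
    """Solve min ||A z - b|| subject to ||z|| <= eta by bisecting on the
    ridge parameter lam, using z(lam) = (A^T A + lam I)^{-1} A^T b."""
    z_ls = np.linalg.lstsq(A, b, rcond=None)[0]
    if np.linalg.norm(z_ls) <= eta:            # bound inactive: plain LS wins
        return z_ls
    AtA, Atb = A.T @ A, A.T @ b
    I = np.eye(A.shape[1])

    def z_of(lam):
        return np.linalg.solve(AtA + lam * I, Atb)

    lo, hi = 0.0, 1.0
    while np.linalg.norm(z_of(hi)) > eta:      # ||z(lam)|| decreases in lam
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if np.linalg.norm(z_of(mid)) > eta else (lo, mid)
    return z_of(hi)
```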
A Canonical Process for Estimation of Convex Functions: the “invelope” of Integrated
A process associated with integrated Brownian motion is introduced that characterizes the limit behavior of nonparametric least squares and maximum likelihood estimators of convex functions and convex densities, respectively. We call this process “the invelope” and show that it is an almost surely uniquely defined function of integrated Brownian motion. Its role is comparable to the role of the...
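The underlying object, integrated Brownian motion Y(t) = ∫₀ᵗ W(s) ds, is easy to simulate by discretization; a minimal sketch (grid size and seed are arbitrary):

```python
import numpy as np

# Simulate a path of Brownian motion W and its integral Y on [0, 1].
rng = np.random.default_rng(0)
n, T = 10_000, 1.0
dt = T / n
W = np.cumsum(rng.normal(scale=np.sqrt(dt), size=n))  # W(t_k)
Y = np.cumsum(W) * dt                                 # Y(t_k) ~ integral of W
```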
Finding a Global Optimal Solution for a Quadratically Constrained Fractional Quadratic Problem with Applications to the Regularized Total Least Squares
We consider a fractional quadratic problem: minimizing the ratio of two indefinite quadratic functions subject to a two-sided quadratic form constraint. This formulation is motivated by the so-called Regularized Total Least Squares problem (RTLS). A key difficulty with this problem is its nonconvexity, and all currently known methods to solve it are only guaranteed to con...
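In the special case without the two-sided constraint, minimizing ||Ax − b||² / (1 + ||x||²) is exactly the ordinary TLS objective, and the minimizer can be read off an eigenvector; a minimal sketch of that unconstrained case only (the constrained RTLS needs dedicated methods):

```python
import numpy as np

def fractional_quadratic_min(A, b):
    """Minimize ||A x - b||^2 / (1 + ||x||^2): the minimizer comes from the
    eigenvector of [A b]^T [A b] with the smallest eigenvalue."""
    C = np.column_stack([A, b])
    _, V = np.linalg.eigh(C.T @ C)   # eigenvalues in ascending order
    v = V[:, 0]
    return -v[:-1] / v[-1]           # assumes the generic case v[-1] != 0
```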
Computational Experiments on the Tikhonov Regularization of the Total Least Squares Problem
In this paper we consider finding meaningful solutions of ill-conditioned overdetermined linear systems Ax ≈ b, where A and b are both contaminated by noise. This kind of problem frequently arises in the discretization of certain integral equations. One of the most popular approaches to finding meaningful solutions of such systems is the so-called total least squares problem. First we introduce this ap...
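For comparison, Tikhonov regularization of the ordinary least squares problem (not yet the total least squares variant considered here) has a familiar closed form; a minimal sketch, with λ supplied by the user:

```python
import numpy as np

def tikhonov(A, b, lam):
    """Tikhonov-regularized LS: argmin ||A x - b||^2 + lam ||x||^2,
    solved in closed form as x = (A^T A + lam I)^{-1} A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```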